161 research outputs found

    Stability and sensitivity of Learning Analytics based prediction models

    Learning analytics seeks to enhance learning processes through systematic measurement of learning-related data and to provide informative feedback to learners and educators. Track data from Learning Management Systems (LMSs) constitute a main data source for learning analytics. This empirical contribution applies Buckingham Shum and Deakin Crick’s theoretical framework of dispositional learning analytics: an infrastructure that combines learning dispositions data with data extracted from computer-assisted formative assessments and LMSs. In two cohorts of a large introductory quantitative methods module, 2049 students were enrolled in a module based on principles of blended learning, combining face-to-face Problem-Based Learning sessions with e-tutorials. We investigated the predictive power of learning dispositions, outcomes of continuous formative assessments and other system-generated data in modelling student performance, as well as their potential to generate informative feedback. From a dynamic, longitudinal perspective, computer-assisted formative assessments appear to be the best predictor both for detecting underperforming students and for academic performance, while basic LMS data did not substantially predict learning. Where timely feedback is crucial, both use-intensity related track data from e-tutorial systems and learning dispositions are valuable sources for feedback generation.

    The Potential for Student Performance Prediction in Small Cohorts with Minimal Available Attributes

    The measurement of student performance during progress through university study provides academic leadership with critical information on each student’s likelihood of success. Academics have traditionally used their interactions with individual students through class activities and interim assessments to identify those “at risk” of failure or withdrawal. However, modern university environments, offering easy online availability of course material, may see reduced lecture/tutorial attendance, making such identification more challenging. Modern data mining and machine learning techniques provide increasingly accurate predictions of student examination marks, although these approaches have focussed upon large student populations and wide ranges of data attributes per student. In contrast, many university modules comprise relatively small student cohorts, with institutional protocols limiting the student attributes available for analysis. Very little research attention appears to have been devoted to this area of analysis and prediction. We describe an experiment conducted on a final-year university module cohort of 23 students, where individual student data are limited to lecture/tutorial attendance, virtual learning environment accesses and intermediate assessments. We found potential for predicting individual students’ interim and final assessment marks in small cohorts with very limited attributes, and that these predictions could support module leaders in identifying students potentially “at risk”.

    Understanding academics’ resistance towards (online) student evaluation

    Many higher educational institutions and academic staff are still sceptical about the validity and reliability of student evaluation questionnaires, particularly when these evaluations are completed online. One month after a university-wide transition from paper to online evaluation across 629 modules, (perceived) resistance and ambivalence amongst academic staff were unpacked. A mixed-method study was conducted amongst 104 academics using survey methods and follow-up semi-structured interviews. Despite a successful ‘technical’ transition (i.e. a response rate of 60% and scores similar to previous evaluations), more than half of respondents reported a negative experience with this transition. The results indicate that the multidimensional nature of ambivalence towards change and the dual nature of student evaluations can influence the effectiveness of organisational transition processes.

    Eliciting students' preferences for the use of their data for learning analytics. A crowdsourcing approach.

    Research on student perspectives of learning analytics suggests that students are generally unaware of the collection and use of their data by their learning institutions, and they are often not involved in decisions about whether and how their data are used. To determine the influence of risks and benefits awareness on students’ data use preferences for learning analytics, we designed two interventions: one describing the possible privacy risks of data use for learning analytics and the other describing the possible benefits. These interventions were distributed amongst 447 participants recruited through a crowdsourcing platform. Participants were randomly assigned to one of three experimental groups – risks, benefits, and risks and benefits – and received the corresponding intervention(s). Participants in the control group received a learning analytics dashboard (as did participants in the experimental conditions). Participants indicated the motivations for their data use preferences. Chapter 11 discusses the implications of our findings in relation to how to better support learning institutions in being more transparent with students about the practice of learning analytics.
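The experimental design in this abstract (random assignment across three intervention groups and a control) can be sketched as balanced random allocation. The participant IDs, group labels, seed, and the sample size of 12 below are all invented stand-ins for the 447 recruits; the study's actual allocation procedure is not specified in the abstract.

```python
# Hypothetical sketch: balanced random assignment to experimental conditions.
# IDs, labels, seed, and sample size are invented for illustration.
import random
from collections import Counter

groups = ["risks", "benefits", "risks_and_benefits", "control"]
participants = [f"p{i:03d}" for i in range(12)]  # stand-in for 447 recruits

rng = random.Random(42)          # fixed seed so the allocation is repeatable
rng.shuffle(participants)

# Deal shuffled participants round-robin into the four conditions,
# which guarantees equal group sizes when the sample divides evenly.
assignment = {pid: groups[i % len(groups)] for i, pid in enumerate(participants)}
group_sizes = Counter(assignment.values())
```

Round-robin dealing after a shuffle is one simple way to get both randomness and exact balance; pure independent assignment would leave group sizes to chance.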

    Analytics in online and offline language learning environments: the role of learning design to understand student online engagement

    Language education has a rich history of research and scholarship focusing on the effectiveness of learning activities and the impact these have on student behaviour and outcomes. One of the basic assumptions in foreign language pedagogy, and CALL in particular, is that learners want to be able to communicate effectively with native speakers of their chosen language. Combining principles of learning analytics and Big Data with learning design, this study used a student-activity-based taxonomy adopted by the Open University UK to inform module design. The learning designs of four introductory and intermediate language education modules and the online engagement of 2111 learners were contrasted using weekly learning design data. We aimed to explore how learning design decisions made by language teachers influenced students’ engagement in the VLE. Using fixed effect models, our findings indicated that 55% of the variance in weekly online engagement across these four modules was explained by the way language teachers designed weekly learning activities. Our learning analytics study highlights the potential affordances for CALL researchers in using the power of learning design and big data to explore and understand the complexities and dynamics of language learning for students and teachers.

    Introducing innovative technologies in higher education: An experience in using geographic information systems for the teaching‐learning process

    In today's world, new technologies are being used for the teaching-learning process in the classroom. Their use to support learning can provide significant advantages for the teaching-learning process and potential benefits for students, as many of these technologies are part of the working life of many current professions. The aim of this study is to analyse the use of innovative technologies for engineering and science education, after examining the data obtained from students on their learning process and experiences. The study focuses on computational geographic information systems, which allow access to and management of large volumes of information and data, and on the assessment of this tool as the basis for a suitable methodology to enhance the teaching-learning process, taking into account the great social impact of big data. The results allow us to identify the main advantages, opportunities and drawbacks of using these technological tools for educational purposes. Finally, a set of initiatives is proposed to complement the teaching activity and to improve user experience in the educational field. This study was supported by the Spanish Research Agency and the European Regional Development Fund under project CloudDriver4Industry TIN2017-89266-R.
